
    A coupled multi-scale approach for the simulation of textile membranes

    The presented work deals with the simulation of problems related to dry fabric materials. Draping over doubly curved molds in particular is a major application area for industrial simulation methods. Although industrial solutions are available, many issues remain open, mainly because the mechanical behavior of dry fabric layers cannot be described with a standard continuum-mechanical approach: the fabric is not a continuum. The idea of the approach presented here is to model the inner structure of the fabric with a unit cell consisting of crossed beams and to couple this inner structure with a macroscopic membrane element (coupled multi-scale approach).
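    A minimal sketch of the idea, under a drastically simplified unit cell: the axial stiffness of the crossed warp and weft beams is smeared over the yarn spacing to give an effective membrane stiffness, which a macroscopic element then uses to turn membrane strains into force resultants. All names and the homogenization rule are illustrative, not the formulation used in the paper.

```python
import numpy as np

def effective_membrane_stiffness(E_warp, E_weft, A_warp, A_weft,
                                 spacing_warp, spacing_weft, k_shear):
    """Homogenized in-plane stiffness (N/m) of a crossed-beam unit cell.

    Axial beam stiffness is smeared over the yarn spacing; the weak in-plane
    shear response of the dry fabric is represented by a single constant.
    """
    C = np.zeros((3, 3))
    C[0, 0] = E_warp * A_warp / spacing_weft   # warp direction
    C[1, 1] = E_weft * A_weft / spacing_warp   # weft direction
    C[2, 2] = k_shear                          # in-plane shear (yarn rotation)
    return C

def membrane_forces(C, strain):
    """Force resultants N = C @ [eps_x, eps_y, gamma_xy] for the macro element."""
    return C @ np.asarray(strain)

C = effective_membrane_stiffness(E_warp=70e9, E_weft=70e9, A_warp=1e-7,
                                 A_weft=1e-7, spacing_warp=2e-3,
                                 spacing_weft=2e-3, k_shear=50.0)
print(membrane_forces(C, [0.01, 0.0, 0.02]))
```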

    P3b reflects periodicity in linguistic sequences

    Temporal predictability is thought to affect stimulus processing by facilitating the allocation of attentional resources. Recent studies have shown that periodicity of a tonal sequence results in a decreased peak latency and a larger amplitude of the P3b compared with temporally random, i.e., aperiodic sequences. We investigated whether this also applies to sequences of linguistic stimuli (syllables), although speech is usually aperiodic. We compared aperiodic syllable sequences with two temporally regular conditions. In one condition, the interval between syllable onsets was fixed, whereas in the second condition the interval between the syllables’ perceptual centers (p-centers) was kept constant. Event-related potentials were assessed in 30 adults who were instructed to detect irregularities in the stimulus sequences. We found larger P3b amplitudes for both temporally predictable conditions compared to the aperiodic condition, and a shorter P3b latency in the p-center condition than in both other conditions. These findings demonstrate that even in acoustically more complex sequences such as syllable streams, temporal predictability facilitates the processing of deviant stimuli. Furthermore, we provide the first electrophysiological evidence for the relevance of the p-center concept in linguistic stimulus processing.
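    A short sketch of how the two regular timing conditions differ, assuming the p-center offset of each syllable within its audio file is already known (the offset values below are hypothetical; the abstract does not give the stimulus-preparation details):

```python
def onsets_fixed_onset(n_stimuli, soa):
    """Isochronous onsets: the onset-to-onset interval (SOA) is constant."""
    return [i * soa for i in range(n_stimuli)]

def onsets_fixed_pcenter(pcenter_offsets, soa):
    """Isochronous p-centers: shift each syllable so that its perceptual
    center, not its acoustic onset, falls on the regular time grid."""
    return [i * soa - offset for i, offset in enumerate(pcenter_offsets)]

# Hypothetical p-center offsets (seconds from acoustic onset) for 4 syllables
print(onsets_fixed_onset(4, soa=0.6))
print(onsets_fixed_pcenter([0.08, 0.12, 0.05, 0.10], soa=0.6))
```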

    Computing on Masked Data to improve the Security of Big Data

    Organizations that make use of large quantities of information require the ability to store and process data from central locations so that the product can be shared or distributed across a heterogeneous group of users. However, recent events underscore the need for improving the security of data stored in such untrusted servers or databases. Advances in cryptographic techniques and database technologies provide the necessary security functionality but rely on a computational model in which the cloud is used solely for storage and retrieval. Much of big data computation and analytics makes use of signal processing fundamentals. As the trend of moving data storage and computation to the cloud increases, homeland security missions should understand the impact of security on key signal processing kernels such as correlation or thresholding. In this article, we propose a tool called Computing on Masked Data (CMD), which combines advances in database technologies and cryptographic tools to provide a low-overhead mechanism to offload certain mathematical operations securely to the cloud. This article describes the design and development of the CMD tool. Comment: 6 pages, accepted to the IEEE HST Conference.
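    As a toy illustration of the idea (a deliberately weak stand-in, not the cryptographic schemes the CMD tool actually uses), an order-preserving affine mask lets an untrusted server run a thresholding kernel without ever seeing plaintext values:

```python
import secrets

def mask(values, scale, shift):
    """Affine masking with a positive random scale preserves ordering, so a
    server can threshold masked values against a masked threshold.
    (Illustration only; this scheme is not semantically secure.)"""
    return [scale * v + shift for v in values]

def server_threshold(masked_values, masked_threshold):
    # The server sees only masked values, never the plaintext.
    return [mv > masked_threshold for mv in masked_values]

data = [3.2, -1.0, 7.5, 0.4]
scale = 1.0 + secrets.randbelow(10**6) / 1e3    # random positive scale
shift = secrets.randbelow(10**6) / 1e3
flags = server_threshold(mask(data, scale, shift), scale * 2.0 + shift)
print(flags)  # identical to thresholding the plaintext at 2.0
```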

    Close-to-process compensation of geometric deviations on implants based on optical measurement data

    The production of implants is challenging due to their complex shapes, delicate structures, and extensive regulatory requirements. Therefore, a manufacturing cell with an integrated optical measurement system was realized. The measurement system is used to determine geometric deviations and to fulfill the documentation obligation. The measurement data are used to create a matrix containing the nominal coordinates and the error vector for compensation points at the relevant shapes. Based on this, the corresponding tool-path segments are isolated in the G-code and compensated in order to reduce the geometric deviations. With this method, the deviation could be reduced by 85%. However, it should be noted that the result of the compensation strongly depends on the quality of the optical measurement data.
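    A minimal sketch of the compensation step under simplifying assumptions: each tool-path point is shifted by the negative of the error vector of the nearest compensation point, and points far from every compensation point are left untouched. The nearest-neighbour lookup and the radius threshold are illustrative choices, not the method described in the paper.

```python
import numpy as np

def compensate_point(nominal_xyz, comp_points, error_vectors, radius=0.5):
    """Shift a tool-path point against the measured deviation.

    comp_points   : (N, 3) nominal coordinates of the compensation points
    error_vectors : (N, 3) measured deviation at each compensation point
    radius        : points farther than this from every compensation point
                    are returned unchanged
    """
    p = np.asarray(nominal_xyz, dtype=float)
    distances = np.linalg.norm(comp_points - p, axis=1)
    nearest = np.argmin(distances)
    if distances[nearest] > radius:
        return p
    return p - error_vectors[nearest]   # counteract the measured deviation

comp = np.array([[10.0, 5.0, 2.0], [12.0, 5.0, 2.0]])
err = np.array([[0.05, 0.00, -0.02], [0.03, 0.01, 0.00]])
print(compensate_point([10.1, 5.0, 2.0], comp, err))
```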

    Biases in, and corrections to, KSB shear measurements

    We analyse the KSB method to estimate gravitational shear from surface-brightness moments of small and noisy galaxy images. We identify three potentially problematic assumptions. These are: (1) While gravitational shear must be estimated from averaged galaxy images, KSB derives a shear estimate from each individual image and then takes the average. Since the two operations do not commute, KSB gives biased results. (2) KSB implicitly assumes that galaxy ellipticities are small, while weak gravitational lensing assures only that the change in ellipticity due to the shear is small. (3) KSB does not invert the convolution with the point-spread function, but gives an approximate PSF correction which, even for a circular PSF, holds only in the limit of circular sources. The effects of assumptions (2) and (3) partially counteract each other in a way that depends on the width of the weight function and of the PSF. We quantitatively demonstrate the biases due to all assumptions, extend the KSB approach consistently to third order in the shear and ellipticity, and show that this extension lowers the biases substantially. The issue of proper PSF deconvolution will be addressed in a forthcoming paper. Comment: 12 pages, 10 figures, submitted to MNRAS.
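    A small sketch of why point (1) matters, using an unweighted complex ellipticity from second brightness moments (KSB itself uses a weight function and noise-corrected moments; the simplification here is only to show that a nonlinear estimator does not commute with averaging):

```python
import numpy as np

def ellipticity(image):
    """Complex ellipticity chi from unweighted second brightness moments."""
    y, x = np.indices(image.shape)
    f = image / image.sum()
    xc, yc = (f * x).sum(), (f * y).sum()
    q11 = (f * (x - xc) ** 2).sum()
    q22 = (f * (y - yc) ** 2).sum()
    q12 = (f * (x - xc) * (y - yc)).sum()
    return (q11 - q22 + 2j * q12) / (q11 + q22)

rng = np.random.default_rng(0)
images = [np.abs(rng.normal(size=(32, 32))) for _ in range(100)]

# Because chi is a nonlinear function of the image, these two generally differ:
mean_of_estimates = np.mean([ellipticity(im) for im in images])
estimate_of_mean = ellipticity(np.mean(images, axis=0))
print(mean_of_estimates, estimate_of_mean)
```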

    Final report on the 2016 survey at the Universitätsbibliothek Mannheim

    Under the motto „Sagen Sie uns Ihre Meinung“ (“Tell us your opinion”), the Universitätsbibliothek Mannheim conducted a large online survey on its entire range of services in October 2016. The aim of the survey was to assess users’ current satisfaction with the library and to determine their expectations regarding future services. This report presents the main results and conclusions from the perspective of the UB Mannheim.

    Performance evaluation of an automated and continuous antibody purification process in a side-by-side comparability study

    Continuous manufacturing (CM) brings cost efficiency, reliability, and scalability to the manufacturing of biopharmaceuticals. Higher flexibility, smaller facility footprints, and cost-of-goods benefits are advantages of this production mode. It offers high flexibility with regard to demand changes from clinical supply to launch and to volatile market dynamics. In combination with disposable equipment, faster time-to-market and closed processing appear feasible. Bayer's unique CM platform consists of a series of downstream processing (DSP) unit operations through which the drug substance moves continuously, with all unit operations running more or less in parallel. The technology offers the potential to make Quality by Design (QbD) a reality, with continuously monitored process parameters, real-time feedback process control that maintains quality-indicating parameters within limits at all times, and multivariate data analysis. Individual unit operations are intelligently integrated, and critical process parameters are monitored and controlled in real time. Conditioning modules allow immediate corrective actions to be executed in an automated fashion to maintain the entire process in a state of control with low batch-to-batch variability. In addition, online sampling and testing functions provide early warning of potential excursions. Reduced manual intervention will also lead to fewer operator errors and the associated deviations. Manufacturing facilities will be significantly less capital-intensive (e.g., through a simpler layout) than large, traditional batch facilities, as disposable technology and aseptic connections offer superior protection against bioburden ingress and other forms of contamination. The presentation also intends to illustrate comparability of CM versus batch processing in a side-by-side approach covering process information, real-time analysis, and quality data from intermediates and final drug substance of an antibody product.
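    A minimal sketch of the kind of real-time feedback loop described above, assuming a single quality-indicating parameter with a setpoint, acceptance limits, and a proportional corrective action (all names and the control law are illustrative, not the platform's actual implementation):

```python
def control_step(measured, setpoint, limits, gain):
    """One feedback step for a quality-indicating parameter.

    Returns a corrective adjustment for the conditioning module and an
    early-warning flag raised when the measurement leaves the limits.
    """
    low, high = limits
    warning = not (low <= measured <= high)
    correction = gain * (setpoint - measured)   # simple proportional action
    return correction, warning

# Example: pH drifting above its upper limit triggers a warning and a correction
print(control_step(measured=7.6, setpoint=7.2, limits=(6.9, 7.5), gain=0.8))
```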

    The Flux Auto- and Cross-Correlation of the Lyman-alpha Forest. II. Modelling Anisotropies with Cosmological Hydrodynamic Simulations

    The isotropy of the Lyman-alpha forest in real space uniquely provides a measurement of cosmic geometry at z > 2. The angular diameter distance for which the correlation function along the line of sight and in the transverse direction agree corresponds to the correct cosmological model. However, the Lyman-alpha forest is observed in redshift space, where distortions due to Hubble expansion, bulk flows, and thermal broadening introduce anisotropy. Similarly, a spectrograph's line spread function affects the autocorrelation and cross-correlation differently. In this, the second paper of a series on using the Lyman-alpha forest observed in pairs of QSOs for a new application of the Alcock-Paczynski (AP) test, these anisotropies and related sources of potential systematic error are investigated with cosmological hydrodynamic simulations. Three prescriptions for galactic outflow were compared and found to have only a marginal effect on the Lyman-alpha flux correlation (which changed by at most 7% with the currently favored variable-momentum wind model versus no winds at all). An approximate solution for obtaining the zero-lag cross-correlation corresponding to arbitrary spectral resolution directly from the zero-lag cross-correlation computed at full resolution (good to within 2% at the scales of interest) is presented. Uncertainty in the observationally determined mean flux decrement of the Lyman-alpha forest was found to be the dominant source of systematic error; however, this is reduced significantly when considering correlation ratios. We describe a simple scheme for implementing our results, while mitigating systematic errors, in the context of a future application of the AP test. Comment: 20 pages.
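    A schematic of the AP comparison the abstract refers to, assuming binned line-of-sight and transverse correlation functions have already been measured: the trial D_A(z) and H(z) that make the two agree after mapping observed separations to distance are preferred. The proper-distance mapping and the interpolation below are illustrative choices, not the paper's pipeline.

```python
import numpy as np

def ap_mismatch(d_a, h_z, theta_bins, xi_perp, dv_bins, xi_par):
    """Squared difference between the transverse and line-of-sight correlation
    functions once both are mapped to proper separation with a trial cosmology
    (D_A in Mpc, H in km/s/Mpc, theta in rad, dv in km/s; dv_bins assumed
    increasing)."""
    r_perp = d_a * theta_bins   # proper transverse separation of the QSO pair
    r_par = dv_bins / h_z       # proper line-of-sight separation
    xi_par_interp = np.interp(r_perp, r_par, xi_par)
    return np.sum((xi_perp - xi_par_interp) ** 2)

# The preferred cosmology minimizes ap_mismatch over trial values of (d_a, h_z).
```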

    Lithium Pollution of White Dwarfs and Other Secrets of MORDOR

    Running diagonally down the middle of the Hertzsprung-Russell Diagram (H-R Diagram), between the white dwarf cooling track and the lower main sequence, lies a less frequently studied region of the H-R Diagram populated by dim objects. We have termed this area the Medial Overlooked Region’s Dim Object Range (MORDOR), and we present the results of an optical spectroscopic survey of 83 objects in MORDOR using the Goodman Spectrograph mounted on the SOAR telescope. We discovered a variety of new and interesting objects, including white dwarf-M dwarf binaries that could reveal the composition of M dwarf winds. We performed a thorough analysis of another MORDOR class: the white dwarfs with metal lines (DZs). These DZs have accreted planetesimals, which can reveal the elemental abundances of rocky bodies in exoplanetary systems. Those abundances provide information on the composition of the nebulae from which the systems formed and the geological processes that the rocky material experienced. A DZ we found in MORDOR and a second DZ outside MORDOR exhibited the first known detections of lithium and potassium in white dwarfs. We find their lithium abundances to be elevated relative to the baseline for the Solar System, which we propose was a consequence of the accreted planetesimals originating from the earliest epochs of the Galaxy, when the ratio of lithium to heavier elements was high due to Big Bang nucleosynthesis. Another observer discovered three additional white dwarfs that are polluted by lithium from extrasolar planetesimals and proposed that the lithium excess was caused by the accretion of continental crust. We analyze the abundances of all five lithium-bearing white dwarfs in the context of the two competing hypotheses: planetary differentiation and the nucleosynthetic evolution of the Galaxy. For three white dwarfs, we can rule out the accretion of continental crust, whereas all but the one magnetic white dwarf are consistent with expectations from Big Bang and Galactic nucleosynthesis. In the future, with further discoveries and the means to constrain the protostellar metallicities of the systems, lithium-polluted white dwarfs will allow for the first proper test of the predicted Big Bang nucleosynthetic lithium abundance. Doctor of Philosophy.